1 - 20 of 473
1.
Synapse; 78(3): e22291, 2024 May.
Article En | MEDLINE | ID: mdl-38733105

Spinal serotonin enables neuro-motor recovery (i.e., plasticity) in patients with debilitating paralysis. While time of day fluctuations in serotonin-dependent spinal plasticity exist, it is unknown, in humans, whether this is due to dynamic changes in spinal serotonin levels or downstream signaling processes. The primary objective of this study was to determine whether time of day variations in spinal serotonin levels exist in humans. To assess this, intrathecal drains were placed in seven adults, with cerebrospinal fluid (CSF) collected at diurnal (05:00 to 07:00) and nocturnal (17:00 to 19:00) intervals. High performance liquid chromatography with mass spectrometry was used to quantify CSF serotonin levels, with comparisons made using univariate analysis. From the 7 adult patients, 21 distinct CSF samples were collected: 9 during the diurnal interval and 12 during the nocturnal interval. Diurnal CSF samples demonstrated an average serotonin level of 216.6 ± 67.7 nM; nocturnal CSF samples, 206.7 ± 75.8 nM. There was no significant difference between diurnal and nocturnal CSF serotonin levels (p = .762). Within this small cohort of spine-healthy adults, there were no differences in diurnal versus nocturnal spinal serotonin levels. These observations exclude spinal serotonin levels as the etiology for time of day fluctuations in the expression of serotonin-dependent spinal plasticity.


Circadian Rhythm; Serotonin; Humans; Serotonin/cerebrospinal fluid; Male; Adult; Female; Circadian Rhythm/physiology; Middle Aged; Spinal Cord/metabolism; Chromatography, High Pressure Liquid; Aged
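The abstract above reports a univariate comparison of two small groups of serotonin measurements. A minimal sketch of such a comparison, using simulated values in place of the study's data (only the group means, SDs, and sample sizes above are real; the test choice is an assumption, since the abstract does not name its exact statistic):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
diurnal = rng.normal(216.6, 67.7, size=9)     # 9 diurnal samples (nM)
nocturnal = rng.normal(206.7, 75.8, size=12)  # 12 nocturnal samples (nM)

# Welch's t-test: does not assume equal variances across intervals
t_stat, p_val = stats.ttest_ind(diurnal, nocturnal, equal_var=False)
print(f"diurnal mean = {diurnal.mean():.1f} nM, "
      f"nocturnal mean = {nocturnal.mean():.1f} nM, p = {p_val:.3f}")
```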
2.
Inj Epidemiol; 11(1): 18, 2024 May 13.
Article En | MEDLINE | ID: mdl-38741167

BACKGROUND: There has been an epidemic of firearm injuries in the United States since the mid-2000s. Thus, we sought to examine whether hospitalizations from firearm injuries have increased over time, and to examine temporal changes in patient demographics, firearm injury intent, and injury severity. METHODS: This was a multicenter, retrospective, observational cohort study of patients hospitalized with a traumatic injury at six US level I trauma centers between 1/1/2016 and 6/30/2022. ICD-10-CM cause codes were used to identify and describe firearm injuries. Temporal trends were compared for demographics (age, sex, race, insured status), intent (assault, unintentional, self-harm, legal intervention, and undetermined), and severity (death, ICU admission, severe injury [injury severity score ≥ 16], receipt of blood transfusion, mechanical ventilation, and hospital and ICU LOS in days). Temporal trends were examined over 13 six-month intervals (H1, January-June; H2, July-December) using joinpoint regression and reported as semi-annual percent change (SPC); significance was set at p < 0.05. RESULTS: Firearm injuries accounted for 2.6% (1908 of 72,474) of trauma hospitalizations. The rate of firearm injuries initially declined from 2016-H1 to 2018-H2 (SPC = -4.0%, p = 0.002), then increased from 2018-H2 to 2020-H1 (SPC = 9.0%, p = 0.005), before stabilizing from 2020-H1 to 2022-H1 (SPC = 0.5%, p = 0.73). NH Black patients had the greatest hospitalization rate from firearm injuries (14.0%) and were the only group to demonstrate a temporal increase (SPC = 6.3%, p < 0.001). The proportion of uninsured patients increased (SPC = 2.3%, p = 0.02), but there were no temporal changes by age or sex. ICU admission rates declined (SPC = -2.2%, p < 0.001), but ICU LOS increased (SPC = 2.8%, p = 0.04). There were no significant changes over time in rates of death (SPC = 0.3%), severe injury (SPC = 1.6%), blood transfusion (SPC = 0.6%), or mechanical ventilation (SPC = 0.6%). When examined by intent, self-harm injuries declined over time (SPC = -4.1%, p < 0.001); assaults declined through 2019-H2 (SPC = -5.6%, p = 0.01) before increasing through 2022-H1 (SPC = 6.5%, p = 0.01); undetermined injuries increased through 2019-H1 (SPC = 24.1%, p = 0.01) and then stabilized (SPC = -4.5%, p = 0.39); there were no temporal changes in unintentional injuries or legal intervention. CONCLUSIONS: Hospitalizations from firearm injuries are increasing following a period of declines, driven by increases among NH Black patients. Trauma systems need to consider these changing trends to best address the needs of the injured population.
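Joinpoint regression reports trends as a percent change per interval by fitting log-linear segments between breakpoints. A hedged sketch of how one segment's semi-annual percent change (SPC) could be derived, using made-up rates (full joinpoint software also searches for the breakpoints themselves):

```python
import numpy as np

# Hypothetical firearm-injury rates (% of trauma admissions) for six
# consecutive half-year intervals within one joinpoint segment
half_years = np.arange(6)                           # 2016-H1 .. 2018-H2
rates = np.array([2.9, 2.8, 2.75, 2.6, 2.55, 2.5])

# Fit log(rate) = a + b * t; the SPC is the back-transformed slope
slope, intercept = np.polyfit(half_years, np.log(rates), 1)
spc = (np.exp(slope) - 1) * 100
print(f"SPC = {spc:.1f}% per half-year")            # negative => decline
```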

3.
JMIR Perioper Med; 7: e52125, 2024 Apr 04.
Article En | MEDLINE | ID: mdl-38573737

BACKGROUND: Pip is a novel digital health platform (DHP) that combines human health coaches (HCs) and technology with patient-facing content. This combination has not been studied in perioperative surgical optimization. OBJECTIVE: This study aimed to test the feasibility of the Pip platform for deploying perioperative, digital, patient-facing optimization guidelines to elective surgical patients, assisted by an HC, at predefined intervals in the perioperative journey. METHODS: We conducted an institutional review board-approved, descriptive, prospective feasibility study of patients scheduled for elective surgery and invited to enroll in Pip from 2.5 to 4 weeks preoperatively through 4 weeks postoperatively at an academic medical center between November 22, 2022, and March 27, 2023. Descriptive primary end points were patient-reported outcomes, including patient satisfaction and engagement, and Pip HC evaluations. Secondary end points included mean or median length of stay (LOS), readmission at 7 and 30 days, and emergency department use within 30 days. Secondary end points were compared between patients who received Pip and patients who did not, using stabilized inverse probability of treatment weighting. RESULTS: A total of 283 patients were invited, of whom 172 (60.8%) enrolled in Pip. Of these, 138 (80.2%) had ≥1 HC session and proceeded to surgery, and 97 (70.3%) of these engaged with Pip postoperatively. On average, engagement began 27 days before surgery. Pip demonstrated an 82% weekly engagement rate with HCs, and patients attended an average of 6.7 HC sessions. Among patients who completed surveys (95/138, 68.8%), satisfaction scores were high (mean 4.8/5). Patients strongly agreed that HCs helped them throughout the perioperative process (mean 4.97/5; n=33). The average net promoter score was 9.7 out of 10. A total of 268 patients in the non-Pip group and 128 patients in the Pip group had appropriately overlapping distributions of stabilized inverse probability of treatment weights for the analytic sample. The Pip cohort was associated with an LOS reduction compared to the non-Pip cohort (mean 2.4 vs 3.1 days; median 1.9, IQR 1.0-3.1 vs median 3.0, IQR 1.1-3.9 days; mean ratio 0.76; 95% CI 0.62-0.93; P=.009). The Pip cohort experienced a 49% lower risk of 7-day readmission (relative risk [RR] 0.51, 95% CI 0.11-2.31; P=.38) and a 17% lower risk of 30-day readmission (RR 0.83, 95% CI 0.30-2.31; P=.73), though these did not reach statistical significance. Both cohorts had similar 30-day emergency department returns (RR 1.06, 95% CI 0.56-2.01; P=.85). CONCLUSIONS: Pip is a novel mobile DHP combining human HCs and perioperative optimization content that feasibly engages patients in their perioperative journey and is associated with reduced hospital LOS. Further studies assessing the impact of Pip or similar DHP-HC combinations on clinical and patient-reported outcomes during the perioperative journey are required.
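Stabilized inverse probability of treatment weighting, as used for the secondary end points above, reweights each patient by the marginal treatment prevalence divided by a propensity score. A self-contained sketch with hypothetical column names (age, asa_class, pip, los_days) and simulated data, not the study's model:

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 400
df = pd.DataFrame({
    "age": rng.normal(60, 10, n),          # hypothetical confounders
    "asa_class": rng.integers(1, 5, n),
    "pip": rng.integers(0, 2, n),          # 1 = enrolled in the DHP
    "los_days": rng.gamma(2.0, 1.5, n),    # hypothetical outcome
})

X = df[["age", "asa_class"]]
ps = LogisticRegression().fit(X, df["pip"]).predict_proba(X)[:, 1]

# Stabilized weights: marginal treatment probability / propensity score
p_treat = df["pip"].mean()
df["w"] = np.where(df["pip"] == 1, p_treat / ps, (1 - p_treat) / (1 - ps))

for grp, g in df.groupby("pip"):
    print(f"pip={grp}: weighted mean LOS = "
          f"{np.average(g['los_days'], weights=g['w']):.2f} days")
```

Stabilizing by the marginal prevalence keeps the weights near 1, which reduces the variance inflation that raw inverse-probability weights can cause.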

4.
Injury; : 111523, 2024 Apr 09.
Article En | MEDLINE | ID: mdl-38614835

BACKGROUND: In patients with severe traumatic brain injury (TBI), clinicians must balance preventing venous thromboembolism (VTE) against the risk of intracranial hemorrhagic expansion (ICHE). We hypothesized that low molecular weight heparin (LMWH) would not increase the risk of ICHE or VTE as compared to unfractionated heparin (UH) in patients with severe TBI. METHODS: Patients ≥ 18 years of age with isolated severe TBI (AIS ≥ 3), admitted to 24 level I and II trauma centers between January 1, 2014 and December 31, 2020, and who received subcutaneous UH or LMWH injections for chemical venous thromboembolism prophylaxis (VTEP) were included. Primary outcomes were VTE and ICHE after VTEP initiation. Secondary outcomes were mortality and neurosurgical interventions. Entropy balancing (EBAL) weighted competing risk or logistic regression models were estimated for all outcomes with the chemical VTEP agent as the predictor of interest. RESULTS: 984 patients received chemical VTEP: 482 UH and 502 LMWH. Patients on LMWH more often had pre-existing conditions such as liver disease (UH vs LMWH 1.7% vs. 4.4%, p = 0.01) and coagulopathy (UH vs LMWH 0.4% vs. 4.2%, p < 0.001). There were no differences in VTE or ICHE after VTEP initiation, and no differences in neurosurgical interventions performed. There were a total of 29 VTE events (3%) in the cohort that received VTEP. A Cox proportional hazards model with a random effect for facility demonstrated no statistically significant difference in time to VTE across the two agents (p = 0.44). The LMWH group had a 43% lower risk of overall ICHE compared to the UH group (HR = 0.57; 95% CI = 0.32-1.03, p = 0.062), but this difference was not statistically significant. CONCLUSION: In this multi-center analysis, patients who received LMWH had a trend toward decreased risk of ICHE, with no differences in VTE, ICHE after VTEP initiation, or neurosurgical interventions compared to those who received UH. There were no safety concerns when using LMWH compared to UH. LEVEL OF EVIDENCE: Level III, Therapeutic Care Management.
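The time-to-VTE analysis above uses a Cox proportional hazards model with a facility random effect. A simplified sketch using the lifelines library, with clustered robust errors standing in for the random effect; all data and column names are simulated, not the registry's:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 500
df = pd.DataFrame({
    "days": rng.exponential(30, n).round(1),   # time to VTE or censoring
    "vte": rng.integers(0, 2, n),              # 1 = VTE observed
    "lmwh": rng.integers(0, 2, n),             # 1 = LMWH, 0 = UH
    "facility": rng.integers(0, 24, n),        # 24 trauma centers
})

cph = CoxPHFitter()
# cluster_col yields robust (sandwich) errors by facility -- a stand-in
# for the random effect used in the study, not an exact equivalent
cph.fit(df, duration_col="days", event_col="vte",
        formula="lmwh", cluster_col="facility")
cph.print_summary()
```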

5.
J Clin Med; 13(8), 2024 Apr 11.
Article En | MEDLINE | ID: mdl-38673475

Background: The objective of this study was to evaluate whether imbalance influences complication rates, radiological outcomes, and patient-reported outcome measures (PROMs) following adult spinal deformity (ASD) surgery. Methods: ASD patients with baseline and 2-year radiographic and PROM data were included. Patients were grouped according to whether they answered yes or no to a recent history of pre-operative loss of balance. The groups were propensity-matched by age, pelvic incidence-lumbar lordosis (PI-LL), and surgical invasiveness score. Results: In total, 212 patients were examined (106 in each group). Patients with gait imbalance had worse baseline PROMs, including the Oswestry disability index (45.2 vs. 36.6), SF-36 mental component score (44 vs. 51.8), and SF-36 physical component score (p < 0.001 for all). After 2 years, patients with gait imbalance had less pelvic tilt correction (-1.2 vs. -3.6°, p = 0.039) for a comparable PI-LL correction (-11.9 vs. -15.1°, p = 0.144). Gait imbalance patients had higher rates of radiographic proximal junctional kyphosis (PJK) (26.4% vs. 14.2%) and implant-related complications (47.2% vs. 34.0%). After controlling for age, baseline sagittal parameters, PI-LL correction, and comorbidities, patients with imbalance had 2.2-fold increased odds of PJK after 2 years. Conclusions: Patients with a self-reported loss of balance/unsteady gait have significantly worse PROMs and a higher risk of PJK.

6.
Oecologia; 204(4): 943-957, 2024 Apr.
Article En | MEDLINE | ID: mdl-38619585

Top carnivores can influence the structure of ecological communities, primarily through competition and predation; however, communities are also influenced by bottom-up forces such as anthropogenic habitat disturbance. Top carnivore declines will likely alter competitive dynamics within and amongst sympatric carnivore species. Increasing intraspecific competition is generally predicted to drive niche expansion and/or individual specialisation, while interspecific competition tends to constrain niches. Using stable isotope analysis of whiskers, we studied the effects of Tasmanian devil Sarcophilus harrisii declines upon the population- and individual-level isotopic niches of Tasmanian devils and sympatric spotted-tailed quolls Dasyurus maculatus subsp. maculatus. We investigated whether time since the onset of devil decline (a proxy for severity of decline) and landscape characteristics affected the isotopic niche breadth and overlap of devil and quoll populations. We quantified individual isotopic niche breadth for a subset of Tasmanian devils and spotted-tailed quolls and assessed whether between-site population niche variation was driven by individual-level specialisation. Tasmanian devils and spotted-tailed quolls demonstrated smaller population-level isotopic niche breadths with increasing human-modified habitat, while time since the onset of devil decline had no effect on population-level niche breadth or interspecific niche overlap. Individual isotopic niche breadths of Tasmanian devils and spotted-tailed quolls were narrower in human-modified landscapes, likely driving population isotopic niche contraction; however, the degree of individuals' specialisation relative to one another remained constant. Our results suggest that across varied landscapes, mammalian carnivore niches can be more sensitive to the bottom-up forces of anthropogenic habitat disturbance than to the top-down effects of top carnivore decline.


Ecosystem; Animals; Marsupialia; Humans; Carnivora
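Isotopic niche breadth in whisker studies like this one is commonly summarised as a standard ellipse area (SEA) over δ13C/δ15N values; whether this study used SEA specifically is not stated in the abstract. A sketch of that ellipse-area calculation on simulated values:

```python
import numpy as np

rng = np.random.default_rng(3)
d13C = rng.normal(-24.0, 1.2, 30)   # simulated whisker-segment carbon values
d15N = rng.normal(8.0, 0.8, 30)     # simulated nitrogen values

# Standard ellipse area: pi times the product of the 1-SD semi-axes,
# which are the square roots of the covariance-matrix eigenvalues
cov = np.cov(np.vstack([d13C, d15N]))
lam = np.linalg.eigvalsh(cov)
sea = np.pi * np.sqrt(lam[0] * lam[1])
print(f"SEA = {sea:.2f} per-mil squared")
```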
8.
Proc Natl Acad Sci U S A; 121(12): e2307780121, 2024 Mar 19.
Article En | MEDLINE | ID: mdl-38466855

Coevolution is common and frequently governs host-pathogen interaction outcomes. Phenotypes underlying these interactions often manifest as the combined products of the genomes of interacting species, yet traditional quantitative trait mapping approaches ignore these intergenomic interactions. Devil facial tumor disease (DFTD), an infectious cancer afflicting Tasmanian devils (Sarcophilus harrisii), has decimated devil populations due to universal host susceptibility and a fatality rate approaching 100%. Here, we used a recently developed joint genome-wide association study (i.e., co-GWAS) approach, 15 y of mark-recapture data, and 960 genomes to identify intergenomic signatures of coevolution between devils and DFTD. Using a traditional GWA approach, we found that both devil and DFTD genomes explained a substantial proportion of variance in how quickly susceptible devils became infected, although genomic architectures differed across devils and DFTD; the devil genome had fewer loci of large effect whereas the DFTD genome had a more polygenic architecture. Using a co-GWA approach, devil-DFTD intergenomic interactions explained ~3× more variation in how quickly susceptible devils became infected than either genome alone, and the top genotype-by-genotype interactions were significantly enriched for cancer genes and signatures of selection. A devil regulatory mutation was associated with differential expression of a candidate cancer gene and showed putative allele matching effects with two DFTD coding sequence variants. Our results highlight the need to account for intergenomic interactions when investigating host-pathogen (co)evolution and emphasize the importance of such interactions when considering devil management strategies.


Communicable Diseases; Daunorubicin/analogs & derivatives; Facial Neoplasms; Marsupialia; Animals; Facial Neoplasms/genetics; Facial Neoplasms/veterinary; Genome-Wide Association Study; Marsupialia/genetics
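The co-GWAS idea above amounts to testing whether a host-genotype by pathogen-genotype interaction explains variation beyond the additive effects of each genome. A toy single-pair version of that test (a real co-GWAS scans many variant pairs with population-structure corrections; everything here is simulated):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 300
host_g = rng.integers(0, 3, n)       # host genotype, coded 0/1/2
tumor_g = rng.integers(0, 3, n)      # tumor variant, coded 0/1/2
# Simulated phenotype: infection timing driven partly by the G x G term
age_at_infection = 5 + 0.3 * host_g * tumor_g + rng.normal(0, 1, n)

X_add = sm.add_constant(np.column_stack([host_g, tumor_g]))
X_int = sm.add_constant(np.column_stack([host_g, tumor_g, host_g * tumor_g]))

fit_add = sm.OLS(age_at_infection, X_add).fit()
fit_int = sm.OLS(age_at_infection, X_int).fit()
print(f"interaction p = {fit_int.pvalues[-1]:.2e}, "
      f"extra variance explained = {fit_int.rsquared - fit_add.rsquared:.3f}")
```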
9.
Neurosurgery; 2024 Mar 29.
Article En | MEDLINE | ID: mdl-38551355

BACKGROUND AND OBJECTIVES: Nearly 30% of older adults presenting with isolated spine fractures will die within 1 year. Attempts to ameliorate this alarming statistic are hindered by our inability to identify relevant risk factors. The primary objective of this study was to develop a prediction model that identifies feasible targets to limit 1-year mortality. METHODS: This retrospective cohort study included 703 older adults (65 years or older) admitted to a level I trauma center with isolated spine fractures, without neural deficit, from January 2013 to January 2018. Multivariable analysis was used to select independently significant patient demographics, frailty variables, injury metrics, and management decisions to incorporate into distinct logistic regression models predicting 1-year mortality. Variables were considered significant if P < .05. RESULTS: Of the 703 older adults, 199 (28.3%) died after hospital discharge but within 1 year of the index trauma. The Risk Analysis Index (RAI; odds ratio [OR]: 1.116; 95% CI: 1.087-1.149; P < .001) and ambulation requiring a cane (OR: 2.601; 95% CI: 1.151-5.799; P = .02) or walker (OR: 4.942; 95% CI: 2.698-9.196; P < .001), ie, frailty variables, were associated with increased odds of 1-year mortality. Spine trauma scales were not associated with 1-year mortality. Longer hospital stays (OR: 1.112; 95% CI: 1.034-1.196; P = .004) and nursing home discharge (OR: 3.881; 95% CI: 2.070-7.378; P < .001) were associated with increased odds, while discharge to rehab (OR: 0.361; 95% CI: 0.155-0.799; P = .014) decreased 1-year mortality odds. A "preinjury" regression model incorporating RAI and ambulation status resulted in an area under the receiver operating characteristic curve (AUROCC) of 0.914 (95% CI: 0.863-0.965). A "postinjury" model incorporating Glasgow Coma Scale, hospital stay duration, and discharge disposition resulted in an AUROCC of 0.746 (95% CI: 0.642-0.849). Combining elements of the preinjury and postinjury models into an "integrated model" produced an AUROCC of 0.908 (95% CI: 0.852-0.965). CONCLUSION: Preinjury frailty measures are most strongly associated with 1-year mortality outcomes in older adults with isolated spine fractures. Incorporating injury metrics or management decisions did not enhance predictive accuracy. Further work is needed to understand how targeting frailty may reduce mortality.
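A minimal sketch of fitting and scoring a logistic regression like the "preinjury" model above (RAI plus ambulation status) against 1-year mortality; the variable names, coefficients, and data below are invented for illustration and are not the paper's estimates:

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
n = 703
df = pd.DataFrame({
    "rai": rng.normal(25, 8, n),        # invented Risk Analysis Index scores
    "walker": rng.integers(0, 2, n),    # ambulation with walker
    "cane": rng.integers(0, 2, n),      # ambulation with cane
})
# Simulate mortality with invented coefficients (not the paper's ORs)
logit = -6 + 0.12 * df["rai"] + 1.6 * df["walker"] + 0.9 * df["cane"]
df["died_1y"] = rng.random(n) < 1 / (1 + np.exp(-logit))

X_tr, X_te, y_tr, y_te = train_test_split(
    df[["rai", "walker", "cane"]], df["died_1y"], random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(f"AUROCC = {auc:.3f}")
```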

10.
Front Immunol; 15: 1286352, 2024.
Article En | MEDLINE | ID: mdl-38515744

The world's largest extant carnivorous marsupial, the Tasmanian devil, is challenged by Devil Facial Tumor Disease (DFTD), a fatal, clonally transmitted cancer. In two decades, DFTD has spread across 95% of the species' distributional range. A previous study showed that factors such as season, geographic location, and infection with DFTD can impact the expression of immune genes in Tasmanian devils. To date, no study has investigated within-individual immune gene expression changes prior to and throughout the course of DFTD infection. To explore possible changes in immune response, we investigated four locations across Tasmania that differed in DFTD exposure history, ranging between 2 and >30 years. Our study demonstrated considerable complexity in the immune responses to DFTD. The same factors (sex, age, season, location, and DFTD infection) affected immune gene expression both across and within devils, although seasonal and location-specific variations were diminished in DFTD-affected devils. We also found that expression of both adaptive and innate immune genes starts to alter early in DFTD infection and continues to change as DFTD progresses. A novel finding was that lower expression of the immune genes MHC-II, NKG2D and CD8 may predict susceptibility to earlier DFTD infection. A case study of a single devil with a regressed tumor showed contrasting immune gene expression patterns compared to the general trends observed across devils with DFTD infection. Our study highlights the complexity of DFTD's interactions with the host immune system and the need for long-term studies to fully understand how DFTD alters the evolutionary trajectory of devil immunity.


Daunorubicin/analogs & derivatives; Facial Neoplasms; Marsupialia; Animals; Facial Neoplasms/genetics; Facial Neoplasms/veterinary; Immune System/pathology; Gene Expression; Marsupialia/genetics
11.
Health Technol Assess; 28(10): 1-213, 2024 Mar.
Article En | MEDLINE | ID: mdl-38477237

Background: The indications for septoplasty are practice-based, rather than evidence-based. In addition, internationally accepted guidelines for the management of nasal obstruction associated with nasal septal deviation are lacking. Objective: The objective was to determine the clinical effectiveness and cost-effectiveness of septoplasty, with or without turbinate reduction, compared with medical management, in the management of nasal obstruction associated with a deviated nasal septum. Design: This was a multicentre randomised controlled trial comparing septoplasty, with or without turbinate reduction, with defined medical management; it incorporated a mixed-methods process evaluation and an economic evaluation. Setting: The trial was set in 17 NHS secondary care hospitals in the UK. Participants: A total of 378 eligible participants aged > 18 years were recruited. Interventions: Participants were randomised on a 1:1 basis, stratified by baseline severity and gender, to either (1) septoplasty, with or without turbinate surgery (n = 188), or (2) medical management with intranasal steroid spray and saline spray (n = 190). Main outcome measures: The primary outcome was the Sino-nasal Outcome Test-22 items score at 6 months (patient-reported outcome). The secondary outcomes were as follows: patient-reported outcomes - Nasal Obstruction Symptom Evaluation score at 6 and 12 months, Sino-nasal Outcome Test-22 items subscales at 12 months, Double Ordinal Airway Subjective Scale at 6 and 12 months, the Short Form questionnaire-36 items and costs; objective measurements - peak nasal inspiratory flow and rhinospirometry. The number of adverse events experienced was also recorded. A within-trial economic evaluation from an NHS and Personal Social Services perspective estimated the incremental cost per (1) improvement (of ≥ 9 points) in Sino-nasal Outcome Test-22 items score, (2) adverse event avoided and (3) quality-adjusted life-year gained at 12 months. An economic model estimated the incremental cost per quality-adjusted life-year gained at 24 and 36 months. A mixed-methods process evaluation was undertaken to understand and address recruitment issues and to examine the acceptability of trial processes and treatment arms. Results: At the 6-month time point, 307 participants provided primary outcome data (septoplasty, n = 152; medical management, n = 155). An intention-to-treat analysis revealed a greater and more sustained improvement in the primary outcome measure in the surgical arm. The 6-month mean Sino-nasal Outcome Test-22 items score was 20.0 points lower (better) for participants randomised to septoplasty than for those randomised to medical management [19.9 in the septoplasty arm vs 39.5 in the medical management arm (95% confidence interval -23.6 to -16.4; p < 0.0001)]. This was confirmed by sensitivity analyses and by the analysis of secondary outcomes. Outcomes were statistically significantly related to baseline severity, but not to gender or turbinate reduction. In the surgical and medical management arms, 132 and 95 adverse events occurred, respectively; 14 serious adverse events occurred in the surgical arm and nine in the medical management arm. On average, septoplasty was more costly and more effective in improving Sino-nasal Outcome Test-22 items scores and quality-adjusted life-years than medical management, but incurred a larger number of adverse events. Septoplasty had a 15% probability of being considered cost-effective at 12 months at a £20,000 willingness-to-pay threshold for an additional quality-adjusted life-year. This probability increased to 99% and 100% at 24 and 36 months, respectively. Limitations: COVID-19 had an impact on participant-facing data collection from March 2020. Conclusions: Septoplasty, with or without turbinate reduction, is more effective than medical management with a nasal steroid and saline spray. Baseline severity predicts the degree of improvement in symptoms. Septoplasty has a low probability of cost-effectiveness at 12 months, but may be considered cost-effective at 24 months. Future work should focus on developing a septoplasty patient decision aid. Trial registration: This trial is registered as ISRCTN16168569 and EudraCT 2017-000893-12. Funding: This award was funded by the National Institute for Health and Care Research (NIHR) Health Technology Assessment programme (NIHR award ref: 14/226/07) and is published in full in Health Technology Assessment; Vol. 28, No. 10. See the NIHR Funding and Awards website for further award information.


Septoplasty is an operation to straighten the septum, the partition wall between the nostrils inside the nose. Septoplasty can be used as a treatment for people who have a bent septum and symptoms of a blocked nose, such as difficulty sleeping and exercising. Medical management (a saltwater spray to clear the nose followed by a nose steroid spray) is an alternative treatment to septoplasty. The Nasal AIRway Obstruction Study (NAIROS) aimed to find out whether septoplasty or medical management is a better treatment for people with a bent septum and symptoms of a blocked nose. We recruited 378 patients with at least moderately severe nose symptoms from 17 hospitals in England, Scotland and Wales to take part in the NAIROS. Participants were randomly put into one of two groups: septoplasty or medical management. Participants' nose symptoms were measured both when they joined the study and after 6 months, using a questionnaire called the Sino-nasal Outcome Test-22 items. This questionnaire was chosen because patients reported that it included symptoms that were important to them. Other studies have shown that a 9-point change in the Sino-nasal Outcome Test-22 items score is significant. After 6 months, on average, people in the septoplasty group improved by 25 points, whereas people in the medical management group improved by 5 points. We saw improvement after septoplasty among patients with moderate symptoms, and among those with severe symptoms. Most patients who we spoke to after a septoplasty were happy with their treatment, but some would have liked more information about what to expect after their nose surgery. In the short term, septoplasty is more costly than medical management. However, over the longer term, when all the costs and benefits of treatment are taken into account, septoplasty would be considered good value for money for the NHS.


Nasal Obstruction; Adult; Humans; Nasal Obstruction/diagnosis; Nasal Obstruction/surgery; Treatment Outcome; Surveys and Questionnaires; Cost-Benefit Analysis; Nasal Septum/surgery; Steroids; Quality of Life
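The trial's headline economics reduce to an incremental cost-effectiveness ratio (ICER) compared against a willingness-to-pay threshold. A worked toy example; only the £20,000/QALY threshold comes from the trial, and the costs and QALYs below are invented for illustration:

```python
# Only the £20,000/QALY willingness-to-pay threshold comes from the trial;
# the costs and QALYs below are hypothetical.
WTP = 20_000

cost_septoplasty, cost_medical = 2_500.0, 400.0   # hypothetical mean costs (£)
qaly_septoplasty, qaly_medical = 1.70, 1.62       # hypothetical mean QALYs

icer = (cost_septoplasty - cost_medical) / (qaly_septoplasty - qaly_medical)
print(f"ICER = £{icer:,.0f} per QALY gained")
print(f"cost-effective at £{WTP:,}/QALY: {icer <= WTP}")
```

With these made-up inputs the ICER lands above the threshold, mirroring the trial's low probability of cost-effectiveness at 12 months; extending the time horizon spreads the one-off surgical cost over more QALYs, which is why the probability rises at 24 and 36 months.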
12.
Article En | MEDLINE | ID: mdl-38462731

STUDY DESIGN: Retrospective cohort. OBJECTIVE: To evaluate factors associated with the long-term durability of cost-effectiveness (CE) in adult spinal deformity (ASD) patients. BACKGROUND: A substantial increase in costs associated with surgical treatment for ASD has given precedence to scrutiny of the value and utility it provides. METHODS: We included 327 operative ASD patients with 5-year (5Y) follow-up. Published methods were used to determine costs based on CMS.gov definitions and average DRG reimbursement rates. Utility was calculated in quality-adjusted life-years (QALYs) using the Oswestry Disability Index (ODI) converted to the Short-Form Six-Dimension (SF-6D), with a 3% discount applied for its decline with life expectancy. A CE threshold of $150,000 per QALY was used for the primary analysis. RESULTS: Major and minor complication rates were 11% and 47%, respectively, with 26% undergoing reoperation by 5Y. The mean cost associated with surgery was $91,095 ± $47,003, with a utility gain of 0.091 ± 0.086 at 1Y, QALYs gained of 0.171 ± 0.183 at 2Y, and 0.42 ± 0.43 at 5Y. The cost per QALY was $414,885 at 2Y, decreasing to $142,058 at 5Y. With the $150,000 threshold, 19% met CE at 2Y and 56% at 5Y. Among those in whom revision was avoided, 87% met cumulative CE through life expectancy. Controlled analysis showed higher baseline CCI and pelvic tilt (PT) to be the strongest predictors of failing to maintain durable CE to 5Y (CCI OR: 1.821 [1.159-2.862], P=0.009; PT OR: 1.079 [1.007-1.155], P=0.030). CONCLUSIONS: Most patients achieved cost-effectiveness by four years postoperatively, with 56% meeting CE at five years. When revision was avoided, 87% of patients met cumulative cost-effectiveness through life expectancy. Mechanical complications were predictive of failure to achieve cost-effectiveness at 2Y, while comorbidity burden and medical complications were predictive at 5Y.
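The utility analysis above discounts QALY gains at 3% per year before dividing cost by cumulative QALYs. A small sketch of that arithmetic; only the mean surgery cost and the discount rate come from the abstract, and the annual utility gain is a hypothetical input (the study's ODI-to-SF-6D conversion is not reproduced):

```python
# Mean surgery cost and the 3% discount rate come from the abstract; the
# annual utility gain is a hypothetical input.
surgery_cost = 91_095.0
annual_utility_gain = 0.10      # hypothetical QALY gain per year
discount = 0.03

# Discount each year's utility gain back to present value, then sum
qalys_5y = sum(annual_utility_gain / (1 + discount) ** t for t in range(1, 6))
print(f"discounted 5-year QALYs = {qalys_5y:.3f}")
print(f"cost per QALY = ${surgery_cost / qalys_5y:,.0f}")
```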

13.
Lancet Rheumatol; 6(4): e237-e246, 2024 Apr.
Article En | MEDLINE | ID: mdl-38423028

BACKGROUND: Osteoarthritis of the knee is a major cause of disability worldwide. Non-operative treatments can reduce the morbidity, but adherence is poor. We hypothesised that adherence could be optimised if behavioural change was established in the preoperative period. Therefore, we aimed to assess the feasibility, acceptability, and recruitment and retention rates of a preoperative package of non-operative care in patients awaiting knee replacement surgery. METHODS: We did an open-label, randomised controlled, feasibility trial in two secondary care centres in the UK. Eligible participants were aged 15-85 years, on the waiting list for a knee arthroplasty for osteoarthritis, and met at least one of the thresholds for one of the four components of the preoperative package of non-operative care intervention (ie, weight loss, exercise therapy, use of insoles, and analgesia adjustment). Participants were randomly assigned (2:1) to either the intervention group or the standard of care (ie, control) group. All four aspects of the intervention were delivered weekly over 12 weeks. Participants in the intervention group were reviewed regularly to assess adherence. The primary outcome was the acceptability and feasibility of delivering the intervention, as measured by recruitment rate, retention rate at follow-up review after planned surgery, health-related quality of life, joint-specific scores, and adherence (weight change and qualitative interviews). This study is registered with ISRCTN, ISRCTN96684272. FINDINGS: Between Sept 3, 2018, and Aug 30, 2019, we screened 233 patients, of whom 163 (73%) were excluded and 60 (27%) were randomly assigned to either the intervention group (n=40) or the control group (n=20). 34 (57%) of 60 participants were women, 26 (43%) were men, and the mean age was 66·8 years (SD 8·6). Uptake of the specific intervention components varied: 31 (78%) of 40 had exercise therapy, 28 (70%) weight loss, 22 (55%) analgesia adjustment, and 18 (45%) insoles. Overall median adherence was 94% (IQR 79·5-100). At the final review, the intervention group had lost a mean of 11·2 kg (SD 5·6) compared with 1·3 kg (3·8) in the control group (estimated difference -9·8 kg [95% CI -13·4 to -6·3]). A clinically significant improvement in health-related quality of life (mean change 0·078 [SD 0·195]) was reported, and joint-specific scores showed greater improvement in the intervention group than in the control group. No adverse events attributable to the intervention occurred. INTERPRETATION: Participants adhered well to the non-operative interventions and their health-related quality of life improved. Participant and health professional feedback were extremely positive. These findings support progression to a full-scale effectiveness trial. FUNDING: Versus Arthritis.


Analgesia; Osteoarthritis; Aged; Female; Humans; Male; Feasibility Studies; Osteoarthritis/therapy; Quality of Life; Weight Loss
14.
Article En | MEDLINE | ID: mdl-38341188

The search for alternatives to live animal sentinels in rodent health monitoring programs is fundamental to the 3Rs (Reduction, Replacement, and Refinement) of animal research. We evaluated the efficacy of a novel battery-operated tumbler device that rotates soiled bedding in direct contact with sample media against the use of exhaust sample media and soiled bedding sentinel (SBS) mice. Four rodent racks were used, each with 3 test cages: a cage with a tumbler device that rotated for 10 min twice a week (TUM10), a cage with a tumbler device that rotated for 60 min twice a week (TUM60), and a cage housing 2 female Crl:CD1(ICR) mice. Every 2 wk, each test cage received soiled bedding collected from all cages on its respective rack. In addition to soiled bedding, the tumbler device contained various sample collection media: a contact Reemay filter (3 mo-cRF) that remained in the tumbler for the duration of the study, a contact Reemay filter (1 mo-cRF) that was replaced monthly, adhesive swabs (AS) that were added at every biweekly cage change, and an exhaust Reemay filter located at the exhaust outlet of the cage. All analyses of sample media in the animal-free methods were performed by direct PCR, while fecal pellets, body swabs, and oral swabs were collected from the sentinel mice. Of 16 total pathogens detected, assessment of 1 mo-cRF from TUM10 and TUM60 cages detected 84% and 79% of pathogens, respectively, while SBS samples detected only 47%; 1 mo-cRF in TUM10 cages thus detected the highest number of pathogens. AS in TUM60 and TUM10 cages detected the fewest pathogens (24% and 13%, respectively). These results indicate that the novel tumbler device is an effective and reliable tool for rodent health monitoring programs and a suitable replacement for live animal sentinels.

15.
Microorganisms; 12(2), 2024 Feb 16.
Article En | MEDLINE | ID: mdl-38399796

Benzalkonium chloride (BC) is widely used for disinfection in the food industry. However, Listeria monocytogenes strains with resistance to BC have been reported recently. In L. monocytogenes, the Agr communication system consists of a membrane-bound peptidase AgrB, a precursor peptide AgrD, a histidine kinase (HK) AgrC, and a response regulator (RR) AgrA. Our previous study showed that the agr genes are significantly upregulated by BC adaptation. This study aimed to investigate the role of the Agr system in BC resistance in L. monocytogenes. Our results showed that the Agr system was involved in BC resistance. However, no direct interaction was observed between BC and AgrC, nor between BC and AgrA, indicating that BC induces the Agr system indirectly. Both AgrBD and AgrC were required for growth under BC stress. Nevertheless, when exposed to BC, the gene deletion mutant ∆agrA strain exhibited better growth performance than its parental strain. The RR Lmo1172 played a role in BC resistance in the ∆agrA strain, suggesting that Lmo1172 may substitute for AgrA in the phosphotransfer pathway. Phosphorylation of Lmo1172 by AgrC was observed in vitro. Lmo1173, the cognate HK of Lmo1172, was not involved in the BC stress response in either the wild-type or the ∆agrA mutant background. Our evidence suggests that the HK AgrC cross-phosphorylates its noncognate RR Lmo1172 to cope with BC stress when its cognate RR AgrA is absent. Further in vivo studies will be required to detect AgrC/AgrA and AgrC/Lmo1172 phosphotransfer.

16.
Spine (Phila Pa 1976); 49(11): 743-751, 2024 Jun 01.
Article En | MEDLINE | ID: mdl-38375611

STUDY DESIGN: Retrospective review of prospectively collected data. OBJECTIVE: To investigate the effect of lower extremity osteoarthritis on sagittal alignment and compensatory mechanisms in adult spinal deformity (ASD). BACKGROUND: Spine, hip, and knee pathologies often overlap in ASD patients. Limited data exist on how lower extremity osteoarthritis impacts sagittal alignment and compensatory mechanisms in ASD. PATIENTS AND METHODS: In total, 527 preoperative ASD patients with full-body radiographs were included. Patients were grouped by Kellgren-Lawrence grade of the bilateral hips and knees and stratified by quartile of T1-pelvic angle (T1PA) severity into low-, mid-, high-, and severe-T1PA groups. Full-body alignment and compensation were compared across quartiles. Regression analysis examined the incremental impact of hip and knee osteoarthritis severity on compensation. RESULTS: The mean T1PA for the low-, mid-, high-, and severe-T1PA groups was 7.3°, 19.5°, 27.8°, and 41.6°, respectively. Mid-T1PA patients with severe hip osteoarthritis had an increased sagittal vertical axis and global sagittal alignment (P < 0.001). Increasing hip osteoarthritis severity resulted in decreased pelvic tilt (P = 0.001) and sacrofemoral angle (P < 0.001), but increased knee flexion (P = 0.012). Regression analysis revealed that with increasing T1PA, pelvic tilt correlated inversely with hip osteoarthritis and positively with knee osteoarthritis (r2 = 0.812). Hip osteoarthritis decreased compensation through the sacrofemoral angle (β-coefficient = -0.206). Knee and hip osteoarthritis contributed to greater knee flexion (β-coefficients = 0.215 and 0.101, respectively). For pelvic shift, only hip osteoarthritis contributed significantly to the model (β-coefficient = 0.100). CONCLUSIONS: For the same magnitude of spinal deformity, increased hip osteoarthritis severity was associated with worse truncal and full-body alignment and posterior translation of the pelvis. Patients with severe hip and knee osteoarthritis exhibited decreased hip extension and pelvic tilt but increased knee flexion. This study examines sagittal alignment and compensation in ASD patients with hip and knee arthritis and may help delineate whether hip and knee flexion is due to spinal deformity compensation or lower extremity osteoarthritis.


Osteoarthritis, Hip; Osteoarthritis, Knee; Humans; Male; Female; Osteoarthritis, Knee/diagnostic imaging; Osteoarthritis, Knee/physiopathology; Osteoarthritis, Knee/surgery; Middle Aged; Osteoarthritis, Hip/diagnostic imaging; Osteoarthritis, Hip/physiopathology; Aged; Retrospective Studies; Adult; Spinal Curvatures/diagnostic imaging; Spinal Curvatures/physiopathology; Radiography
17.
Nat Ecol Evol; 8(2): 293-303, 2024 Feb.
Article En | MEDLINE | ID: mdl-38191839

Top predator declines are pervasive and often have dramatic effects on ecological communities via changes in food web dynamics, but their evolutionary consequences are virtually unknown. Tasmania's top terrestrial predator, the Tasmanian devil, is declining due to a lethal transmissible cancer. Spotted-tailed quolls benefit via mesopredator release, and they alter their behaviour and resource use concomitant with devil declines and increased disease duration. Here, using a landscape community genomics framework to identify environmental drivers of population genomic structure and signatures of selection, we show that these biotic factors are consistently among the top variables explaining genomic structure of the quoll. Landscape resistance negatively correlates with devil density, suggesting that devil declines will increase quoll genetic subdivision over time, despite no change in quoll densities detected by camera trap studies. Devil density also contributes to signatures of selection in the quoll genome, including genes associated with muscle development and locomotion. Our results provide some of the first evidence of the evolutionary impacts of competition between a top predator and a mesopredator species in the context of a trophic cascade. As top predator declines are increasing globally, our framework can serve as a model for future studies of evolutionary impacts of altered ecological interactions.


Marsupialia; Animals; Marsupialia/genetics; Metagenomics; Population Dynamics; Food Chain
18.
BMC Res Notes; 17(1): 36, 2024 Jan 24.
Article En | MEDLINE | ID: mdl-38268014

OBJECTIVE: With an ageing population and increasing osteoarthritis prevalence, quantifying the nociceptive signals responsible for painful movements and individual responses could lead to better treatment and monitoring solutions. Changes in electrodermal activity (EDA) can be detected via changes in skin conductance (SC) and measured using finger electrodes on a wearable sensor, providing objective information on an increased physiological stress response. RESULTS: To provide preliminary EDA response data, EDA was recorded from healthy volunteers during an array of activities while a noxious stimulus was applied. This provides a defined scenario that can be used for protocol feasibility testing. Raw signal extraction, processing, and statistical analysis were performed using mean SC values on all participant data. Application of the stimulus resulted in a significant average increase (p < 0.05) in mean SC in four out of five activities, with significant gender differences (p < 0.05) in SC and self-reported pain scores, and large effect sizes. Though EDA parameters are a promising indicator of nociceptive response, limitations, including sensitivity to motion artifacts and the scarcity of published movement-based EDA data, restrict interpretation of the analysis. Refined processing pipelines with signal decomposition tools could be used in a protocol that quantifies nociceptive response in a clinically meaningful way.


Galvanic Skin Response; Nociception; Humans; Movement; Aging; Electrodes
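The core EDA summary above is a mean skin conductance per activity, compared with and without the noxious stimulus. A hedged sketch using a paired test on simulated per-participant means; the study's actual processing pipeline, sample size, and artifact handling are not reproduced:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
n = 15                                       # hypothetical participant count
baseline_sc = rng.normal(5.0, 1.0, n)        # mean SC per person, microsiemens
stimulus_sc = baseline_sc + rng.normal(0.6, 0.4, n)  # with noxious stimulus

# Paired comparison of per-participant mean SC for one activity
t_stat, p_val = stats.ttest_rel(stimulus_sc, baseline_sc)
print(f"mean SC {baseline_sc.mean():.2f} -> {stimulus_sc.mean():.2f} uS, "
      f"paired p = {p_val:.4f}")
```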
19.
Eur Heart J; 45(8): 601-609, 2024 Feb 21.
Article En | MEDLINE | ID: mdl-38233027

BACKGROUND AND AIMS: Predicting personalized risk for adverse events following percutaneous coronary intervention (PCI) remains critical in weighing treatment options, employing risk mitigation strategies, and enhancing shared decision-making. This study aimed to employ machine learning models using pre-procedural variables to accurately predict common post-PCI complications. METHODS: A group of 66 adults completed a semiquantitative survey assessing a preferred list of outcomes and model display. The machine learning cohort included 107 793 patients undergoing PCI procedures performed at 48 hospitals in Michigan between 1 April 2018 and 31 December 2021 in the Blue Cross Blue Shield of Michigan Cardiovascular Consortium (BMC2) registry, separated into training and validation cohorts. External validation was conducted in the Cardiac Care Outcomes Assessment Program database of 56 583 procedures in 33 hospitals in Washington. RESULTS: The overall rate of in-hospital mortality was 1.85% (n = 1999), acute kidney injury 2.51% (n = 2519), new-onset dialysis 0.44% (n = 462), stroke 0.41% (n = 447), major bleeding 0.89% (n = 942), and transfusion 2.41% (n = 2592). The model demonstrated robust discrimination and calibration for mortality {area under the receiver-operating characteristic curve [AUC]: 0.930 [95% confidence interval (CI) 0.920-0.940]}, acute kidney injury [AUC: 0.893 (95% CI 0.883-0.903)], dialysis [AUC: 0.951 (95% CI 0.939-0.964)], stroke [AUC: 0.751 (95% CI 0.714-0.787)], transfusion [AUC: 0.917 (95% CI 0.907-0.925)], and major bleeding [AUC: 0.887 (95% CI 0.870-0.905)]. Similar discrimination was noted in the external validation population. Survey subjects preferred a comprehensive list of individually reported post-procedure outcomes. CONCLUSIONS: Using common pre-procedural risk factors, the BMC2 machine learning models accurately predict post-PCI outcomes. Utilizing patient feedback, the BMC2 models employ a patient-centred tool to clearly display risks to patients and providers (https://shiny.bmc2.org/pci-prediction/). Enhanced risk prediction prior to PCI could help inform treatment selection and shared decision-making discussions.


Acute Kidney Injury; Percutaneous Coronary Intervention; Stroke; Humans; Percutaneous Coronary Intervention/methods; Patient Preference; Treatment Outcome; Renal Dialysis; Risk Factors; Hemorrhage/etiology; Machine Learning; Stroke/etiology; Acute Kidney Injury/etiology; Risk Assessment/methods
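The registry models above are judged on discrimination (AUC) and calibration. A generic sketch of both checks on simulated data; the BMC2 features, model class, and tuning are not described in the abstract and are not reproduced here:

```python
import numpy as np
from sklearn.calibration import calibration_curve
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 5000
X = rng.normal(size=(n, 5))            # stand-ins for pre-procedural variables
# Simulated rare outcome (~5% event rate), driven by two of the features
y = rng.random(n) < 1 / (1 + np.exp(-(X[:, 0] + 0.5 * X[:, 1] - 3)))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier().fit(X_tr, y_tr)
prob = model.predict_proba(X_te)[:, 1]

print(f"AUC = {roc_auc_score(y_te, prob):.3f}")
# Calibration: binned predicted probability vs observed event fraction
obs, pred = calibration_curve(y_te, prob, n_bins=5)
for o, p in zip(obs, pred):
    print(f"predicted {p:.3f} vs observed {o:.3f}")
```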
20.
Disabil Rehabil; 46(4): 637-649, 2024 Feb.
Article En | MEDLINE | ID: mdl-36772816

PURPOSE: Although facemasks are a well-established aspect of healthcare practice, their impact on verbal communication is surprisingly ambiguous. MATERIALS AND METHODS: A systematic search was conducted in the APA PsycInfo, CINAHL, NHS Knowledge Network, Medline and SPORTDiscus databases from inception to November 2022, according to the PRISMA guidelines. Studies reporting an objective measure of speech understanding in adults, where information was transmitted or received whilst wearing a facemask, were included. Risk of bias of included studies was assessed with the Newcastle-Ottawa score. RESULTS: Four hundred and thirty-three studies were identified, of which fifteen were suitable for inclusion, incorporating 350 participants with a median age of 49 (range 19 to 74) years. Wide heterogeneity of test parameters and outcome measurement prohibited pooling of data. 93% (14 of 15) of studies reported a deleterious effect of facemasks on speech understanding, and 100% (5 of 5) of the included studies reported attenuation of sound with facemask usage. Background noise added further deleterious effects on speech understanding, which was particularly problematic within hearing-impaired populations. Risk of bias in the included studies varied but overall was modest. CONCLUSIONS: Despite considerable complexity and heterogeneity in outcome measures, 93% (14 of 15) of articles suggest respiratory protective equipment negatively affects speech understanding in normal-hearing and hearing-impaired adults.


As a result of the COVID-19 pandemic, facemask use is now commonplace across all healthcare and rehabilitation settings and has material implications for interpersonal communication.
This systematic review of human communicative studies highlights that the use of facemasks does indeed inhibit communication through effects on speech intelligibility and through sound attenuation.
These effects are evident in both normal-hearing and hearing-impaired adults due to the visual cues required for lipreading and facial expressions during communication.
The presence of background noise also produces deleterious effects on speech understanding and is more problematic for hearing-impaired populations.
Simple recommendations to reduce background noise (where possible), to step closer (where social-distancing rules permit), to speak louder or to use speech-to-text applications (if practical) could all mitigate these communicative barriers. Further, an awareness of persons with hearing impairments, the function (or otherwise) of hearing aids in those patients that require them, and an ability to use transparent facemasks can be specifically helpful.


Hearing Loss; Speech Perception; Adult; Humans; Young Adult; Middle Aged; Aged; Masks; Noise; Communication
...